# Code-Optimized Tokenizer
## Multilingual ModernBert Large Preview

License: MIT

A large multilingual ModernBERT model developed by the Algomatic team. It supports a context length of 8192 tokens, was trained on approximately 60 billion tokens, and is suited to masked-token (fill-mask) tasks.
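A minimal sketch of using the model for mask filling via the Hugging Face `transformers` fill-mask pipeline. The hub id below is an assumption inferred from the author name shown on this page; verify the actual repository id before use.

```python
# Assumed Hugging Face hub id -- not confirmed by the listing above.
MODEL_ID = "makiart/multilingual-modernbert-large-preview"

def top_fill_mask_predictions(text: str, k: int = 3) -> list[dict]:
    """Return the top-k predictions for the [MASK] token in `text`."""
    # Imported lazily so this module loads without `transformers` installed;
    # calling the function requires `transformers` and a model download.
    from transformers import pipeline

    fill = pipeline("fill-mask", model=MODEL_ID, top_k=k)
    # Each prediction dict includes `token_str` (the candidate token)
    # and `score` (its probability).
    return fill(text)
```

For example, `top_fill_mask_predictions("The capital of France is [MASK].")` would return the top candidate tokens with their scores (this downloads the model weights on first use).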
Tags: Large Language Model

Author: makiart
© 2025 AIbase